BFGS with Update Skipping and Varying Memory


Similar Articles

BFGS with Update Skipping and Varying Memory

We give conditions under which limited-memory quasi-Newton methods with exact line searches will terminate in n steps when minimizing n-dimensional quadratic functions. We show that although all Broyden family methods terminate in n steps in their full-memory versions, only BFGS does so with limited memory. Additionally, we show that full-memory Broyden family methods with exact line searches t...
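The quadratic-termination property described in this abstract can be illustrated with a minimal full-memory BFGS loop using exact line searches on f(x) = 0.5 xᵀAx − bᵀx; on a quadratic, the exact step length along a direction d has the closed form α = −(gᵀd)/(dᵀAd). This sketch is illustrative (the function name and setup are not from the paper):

```python
import numpy as np

def bfgs_exact_quadratic(A, b, x0):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with full-memory BFGS and exact line searches.
    Returns the final iterate and the number of steps taken."""
    n = len(b)
    H = np.eye(n)                      # inverse-Hessian approximation H_0 = I
    x = x0.astype(float)
    g = A @ x - b                      # gradient of the quadratic
    for k in range(n):
        d = -H @ g                     # quasi-Newton direction
        alpha = -(g @ d) / (d @ A @ d) # exact minimizer along d
        s = alpha * d
        x = x + s
        g_new = A @ x - b
        y = g_new - g
        rho = 1.0 / (y @ s)            # y^T s > 0 on an SPD quadratic
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)  # BFGS inverse update
        g = g_new
        if np.linalg.norm(g) < 1e-10:  # terminated (in at most n steps)
            break
    return x, k + 1
```

In exact arithmetic, the iterates coincide with conjugate-gradient iterates, so the gradient vanishes after at most n steps; in floating point the residual is merely tiny rather than zero.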


Generalizations of the limited-memory BFGS method based on the quasi-product form of update

Two families of limited-memory variable metric (quasi-Newton) methods for unconstrained minimization, based on the quasi-product form of update, are derived. For the first family, four variants of how to utilize the Strang recurrences for the Broyden class of variable metric updates are investigated; three of them use the same number of stored vectors as the limited-memory BFGS method. Moreover, one ...
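The Strang recurrences mentioned above are usually implemented as the classic two-loop recursion, which applies the implicit inverse-Hessian approximation H_k to a vector using only the stored pairs (s_i, y_i) and a scalar initial scaling H_0 = γI, without ever forming H_k. A minimal sketch (names are illustrative):

```python
import numpy as np

def lbfgs_two_loop(grad, s_list, y_list, gamma):
    """Compute H_k @ grad via the two-loop (Strang) recursion.
    s_list, y_list hold the stored correction pairs, oldest first;
    the initial approximation is H_0 = gamma * I."""
    q = grad.astype(float).copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    r = gamma * q
    # Second loop: oldest pair to newest (alphas were stored newest-first).
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * (y @ r)
        r += (a - beta) * s
    return r
```

The result is algebraically identical to building H_k by applying the dense BFGS inverse update pair by pair, but uses only O(mn) work and storage for m stored pairs.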


Solving Limited-Memory BFGS Systems with Generalized Diagonal Updates

In this paper, we investigate a formula to solve systems of the form (B_k + D)x = y, where B_k comes from a limited-memory BFGS quasi-Newton method and D is a diagonal matrix with diagonal entries d_ii ≥ σ for some σ > 0. These types of systems arise naturally in large-scale optimization. We show that provided a simple condition holds on B_0 and σ, the system (B_k + D)x = y can be solved via a recu...
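To make the structure of these systems concrete, the following sketch forms B_k explicitly from stored pairs via the direct BFGS update and solves (B_k + D)x = y with a dense factorization. This is only a naive baseline for illustration (the helper names are not from the paper); the paper's contribution is a recursion that solves such systems without forming B_k at all:

```python
import numpy as np

def dense_bfgs_matrix(B0, s_list, y_list):
    """Form B_k explicitly by applying the direct BFGS update
    B <- B - (B s)(B s)^T / (s^T B s) + y y^T / (y^T s) per stored pair."""
    B = B0.astype(float).copy()
    for s, y in zip(s_list, y_list):
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
    return B

def solve_shifted(B0, s_list, y_list, d, rhs):
    """Naive dense solve of (B_k + D) x = rhs, with D = diag(d), d_ii >= sigma > 0."""
    B = dense_bfgs_matrix(B0, s_list, y_list)
    return np.linalg.solve(B + np.diag(d), rhs)
```

Since each update preserves positive definiteness when yᵀs > 0 and D is a positive diagonal shift, B_k + D is symmetric positive definite and the system is always solvable; the dense approach simply costs O(n³), which is what the recursion in the paper avoids.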


Global convergence of online limited memory BFGS

Global convergence is established for an online (stochastic) limited-memory version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method for solving optimization problems with stochastic objectives that arise in large-scale machine learning. Lower and upper bounds on the Hessian eigenvalues of the sample functions are shown to suffice to guarantee that the curvature approximation ma...


A Numerical Study of Limited Memory BFGS

The application of quasi-Newton methods is widespread in numerical optimization. Independently of the application, the techniques used to update the BFGS matrices seem to play an important role in the performance of the overall method. In this paper we address precisely this issue. We compare two implementations of the limited-memory BFGS method for large-scale unconstrained problems. They differ...



Journal

Journal: SIAM Journal on Optimization

Year: 1998

ISSN: 1052-6234, 1095-7189

DOI: 10.1137/s1052623496306450